651 research outputs found

    Análisis de la eficiencia de un programa de valoración no presencial de consultas preferentes en aparato digestivo

    Get PDF
    In Castilla y León, a Strategic Plan for Efficiency and for the Control and Reduction of Waiting Lists (PERYCLES) was launched for the 2015-2019 period, with the aim of improving waiting lists and delays, prioritising care according to severity and urgency, and doing so in an efficient and sustainable way. One of the tools recommended to achieve these objectives is the implementation of preferential non-face-to-face consultations. The main objective of this work is to analyse the efficiency of the non-face-to-face preferential assessment consultation in Gastroenterology. To that end, a retrospective, cross-sectional, descriptive study was designed that included all patients referred by their Family Medicine specialist to the non-face-to-face preferential Gastroenterology consultation from July 2017 to 31 December 2018. The study variables were demographic data (age, sex, place of residence and toxic habits), referral data (referral date, reason for referral, examinations performed before referral and previous history in Gastroenterology) and the action taken after assessment in the consultation (decision taken, date of the face-to-face appointment, examinations requested by Gastroenterology). A total of 698 non-face-to-face consultations were assessed; 55.9% of the patients were women, the mean age was 58 ± 19 years and 53.3% of the patients lived in an urban setting. Of all the patients assessed, 44.9% had a previous history in the Gastroenterology Department. The most frequent reason for referral was abdominal pain (20.4%), and 35% of patients were referred after only a blood test. After the non-face-to-face consultation, 32.2% of patients were given a routine appointment and waited a mean of 54 days for the face-to-face consultation; 21.8% were given a preferential appointment and waited a mean of 16 days; 6.6% were given an appointment after the requested examinations and waited a mean of 63 days; 23.8% were not given an appointment; and in 13.5% of cases an examination was requested without an appointment being given. After the face-to-face consultation, the most frequently requested complementary examination was endoscopy (54%). Four percent of patients died during follow-up. Comparing the results with other studies and with the initial objectives of PERYCLES, we conclude that non-face-to-face preferential assessment is a useful management tool for prioritising gastroenterology outpatient care according to severity, reducing the number of face-to-face appointments and improving accessibility for those patients who required preferential or routine face-to-face assessment. Grado en Medicina

    SerAPI: Machine-Friendly, Data-Centric Serialization for COQ: Technical Report

    Get PDF
    We present SerAPI, a library and protocol for machine-friendly communication with the Coq proof assistant. SerAPI is implemented using OCaml's PPX pre-processing technology, and it is specifically targeted at reducing the implementation burden for tools such as Integrated Development Environments or code analyzers. SerAPI tries to address common problems faced by tools interacting with Coq, providing a uniform and data-centric way to access term representations and proof state, and an extended protocol for document building. SerAPI is work in progress but fully functional. It has been adopted by the jsCoq and PeaCoq Integrated Development Environments, and supports running inside a web browser instance. In the near future, we are focused on extending the document protocol and providing advanced display abilities to clients.
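
    The abstract notes that SerAPI relies on OCaml's PPX pre-processing to expose internal data structures in a uniform, data-centric form. As a rough illustration of that general approach only (the datatype and names below are invented for this example and are not SerAPI's or Coq's), a single [@@deriving sexp] annotation from ppx_sexp_conv is enough to obtain S-expression serializers for an ordinary OCaml type:

        (* Minimal sketch of PPX-derived serialization in the style SerAPI builds on.
           The "term" type below is illustrative only, not Coq's representation. *)
        open Sexplib.Std

        type term =
          | Var of string
          | App of term * term list
        [@@deriving sexp]

        let () =
          let t = App (Var "plus", [Var "n"; Var "m"]) in
          (* Serialize the value to an S-expression and print it. *)
          print_endline (Sexplib.Sexp.to_string_hum (sexp_of_term t));
          (* Round-trip: parse the S-expression back into the datatype. *)
          let s = Sexplib.Sexp.of_string "(App (Var plus) ((Var n) (Var m)))" in
          assert (term_of_sexp s = t)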

    Optimal investment and hedging under partial and inside information

    Get PDF
    This article concerns optimal investment and hedging for agents who must use trading strategies adapted to the filtration generated by asset prices, possibly augmented with some inside information related to the future evolution of an asset price. The price evolution and observations are taken to be continuous, so the partial (and, when applicable, inside) information scenario is characterised by asset price processes with an unknown drift parameter, which is to be filtered from price observations. We first give an exposition of filtering theory, leading to the Kalman-Bucy filter. We outline the dual approach to portfolio optimisation, which is then applied to the Merton optimal investment problem when the agent does not know the drift parameter of the underlying stock. This is taken to be a random variable with a Gaussian prior distribution, which is updated via the Kalman filter. The result is a model with a stochastic drift process adapted to the observation filtration, which can be treated as a full information problem, and an explicit solution to the optimal investment problem is obtained. We also consider the same problem when the agent has noisy knowledge at time 0 of the terminal value of the Brownian motion driving the stock. Using techniques of enlargement of filtration to accommodate the insider's additional knowledge, followed by filtering the asset price drift, we are again able to obtain an explicit solution. Finally we treat an incomplete market hedging problem. A claim on a non-traded asset is hedged using a correlated traded asset. We summarise the full information case, then treat the partial information scenario in which the hedger is uncertain of the true values of the asset price drifts. After filtering, the resulting problem with random drifts is solved in the case that each asset's prior distribution has the same variance, resulting in analytic approximations for the optimal hedging strategy.
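
    For concreteness, the simplest instance of the filtering step described above can be written out explicitly. The notation below is ours, chosen for illustration; the paper's model may differ in details. With observed (log-)price increments dY_t = mu dt + sigma dW_t and a Gaussian prior mu ~ N(m_0, v_0), the Kalman-Bucy filter gives the posterior mean m_t and variance v_t as

        % Kalman-Bucy filtering of an unknown constant drift (illustrative notation).
        dm_t = \frac{v_t}{\sigma^2}\,\bigl(dY_t - m_t\,dt\bigr), \qquad
        \frac{dv_t}{dt} = -\frac{v_t^2}{\sigma^2}
        \;\Longrightarrow\;
        v_t = \frac{v_0\,\sigma^2}{\sigma^2 + v_0\,t}.
        % The innovations process d\widehat{W}_t = (dY_t - m_t\,dt)/\sigma is a
        % Brownian motion in the observation filtration, so the model with drift m_t
        % can then be treated as a full-information problem.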

    Really Natural Linear Indexed Type Checking

    Full text link
    Recent works have shown the power of linear indexed type systems for enforcing complex program properties. These systems combine linear types with a language of type-level indices, allowing more fine-grained analyses. Such systems have been fruitfully applied in diverse domains, including implicit complexity and differential privacy. A natural way to enhance the expressiveness of this approach is by allowing the indices to depend on runtime information, in the spirit of dependent types. This approach is used in DFuzz, a language for differential privacy. The DFuzz type system relies on an index language supporting real and natural number arithmetic over constants and variables. Moreover, DFuzz uses a subtyping mechanism to make types more flexible. By themselves, linearity, dependency, and subtyping each require delicate handling when performing type checking or type inference; their combination increases this challenge substantially, as the features can interact in non-trivial ways. In this paper, we study the type-checking problem for DFuzz. We show how we can reduce type checking for (a simple extension of) DFuzz to constraint solving over a first-order theory of naturals and real numbers which, although undecidable, can often be handled in practice by standard numeric solvers
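
    The reduction described above (type checking generates arithmetic constraints over index variables, which are handed off to a numeric solver) can be illustrated schematically. The sketch below is ours and uses invented names; it is not DFuzz's actual index language or constraint format, only an indication of the kind of obligation that ends up in SMT-LIB form:

        (* Illustrative sketch: a toy index language and its translation to SMT-LIB 2,
           standing in for the constraints a DFuzz-style type checker might emit. *)
        type index =
          | Const of float              (* numeric literal *)
          | Var of string               (* index variable *)
          | Add of index * index
          | Mul of index * index

        (* A subtyping obligation  i <= j  between two index expressions. *)
        type constr = Leq of index * index

        let rec smt_of_index = function
          | Const c -> Printf.sprintf "%g" c
          | Var x -> x
          | Add (i, j) -> Printf.sprintf "(+ %s %s)" (smt_of_index i) (smt_of_index j)
          | Mul (i, j) -> Printf.sprintf "(* %s %s)" (smt_of_index i) (smt_of_index j)

        let smt_of_constr (Leq (i, j)) =
          Printf.sprintf "(assert (<= %s %s))" (smt_of_index i) (smt_of_index j)

        let () =
          (* An obligation of the shape  3*e + 1 <= k  over real-valued indices. *)
          let c = Leq (Add (Mul (Const 3., Var "e"), Const 1.), Var "k") in
          print_endline "(declare-const e Real) (declare-const k Real)";
          print_endline (smt_of_constr c);
          print_endline "(check-sat)"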

    Adventures in the (not so) Complex Space

    No full text
    We report on the progress of a constructive mechanization for a small subset of signal processing theory, built upon the SSReflect and MathComp libraries. The development was started to provide mechanized semantics for audio programming languages. Currently, we have formalized several standard properties of the Discrete Fourier Transform, such as its unitary matrix form and its power and convolution theorems. Future goals include transfer functions and constant overlap-add processing. At the workshop, we aim to discuss the needs and limits of our current approach, surveying some mathematical concepts not covered by existing libraries, and similar efforts in other frameworks and theorem provers.
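
    For reference, the classical statements behind the formalized properties can be written as follows (standard notation; the mechanized forms and normalization conventions in the development may differ):

        % Discrete Fourier Transform of x = (x_0, \dots, x_{N-1}):
        X_k = \sum_{n=0}^{N-1} x_n \, e^{-2\pi i k n / N}
        % Unitary matrix form: \tfrac{1}{\sqrt{N}} F_N is unitary, where (F_N)_{kn} = e^{-2\pi i k n / N}.
        % Power (Parseval) theorem:
        \sum_{n=0}^{N-1} |x_n|^2 = \frac{1}{N} \sum_{k=0}^{N-1} |X_k|^2
        % Circular convolution theorem:
        \mathrm{DFT}(x \circledast y)_k = X_k \, Y_k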

    The Ciao clp(FD) library. A modular CLP extension for Prolog

    Get PDF
    We present a new free library for Constraint Logic Programming over Finite Domains, included with the Ciao Prolog system. The library is entirely written in Prolog, leveraging Ciao's module system and code transformation capabilities in order to achieve a highly modular design without compromising performance. We describe the interface, implementation, and design rationale of each modular component. The library meets several design goals: a high level of modularity, allowing the individual components to be replaced by different versions; high efficiency, being competitive with other FD implementations; a glass-box approach, so the user can specify new constraints at different levels; and a Prolog implementation, in order to ease the integration with Ciao's code analysis components. The core is built upon two small libraries which implement integer ranges and closures. On top of that, a finite domain variable datatype is defined, taking care of constraint re-execution depending on range changes. These three libraries form what we call the FD kernel of the library. This FD kernel is used in turn to implement several higher-level finite domain constraints, specified using indexicals. Together with a labeling module, this layer forms what we name the FD solver. A final level integrates the CLP(FD) paradigm with our FD solver. This is achieved using attributed variables and a compiler from the CLP(FD) language to the set of constraints provided by the solver. It should be noted that the user of the library is encouraged to work at any of those levels as seen convenient: from writing a new range module to enriching the set of FD constraints by writing new indexicals.
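
    To make the "ranges plus indexicals" layering more concrete, here is a deliberately tiny sketch, written in OCaml purely for illustration and bearing no relation to the library's actual Prolog code: an indexical such as X in min(Y)+min(Z)..max(Y)+max(Z) narrows the range of X whenever the ranges of Y and Z change.

        (* Conceptual illustration only, not the Ciao library's code. *)
        type range = { lo : int; hi : int }   (* closed integer interval *)

        let intersect a b = { lo = max a.lo b.lo; hi = min a.hi b.hi }

        (* Propagator for X = Y + Z, phrased as an indexical narrowing X. *)
        let propagate_sum y z x =
          intersect x { lo = y.lo + z.lo; hi = y.hi + z.hi }

        let () =
          let y = { lo = 1; hi = 3 } and z = { lo = 2; hi = 5 } in
          let x = propagate_sum y z { lo = 0; hi = 100 } in
          Printf.printf "X in %d..%d\n" x.lo x.hi   (* prints: X in 3..8 *)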

    Caracterización del oficio del Visitador Médico en Medellín: un análisis de la década 2002 al 2012

    Get PDF
    This article provides a detailed description of the work of medical sales representatives over a ten-year period (2002-2012), with the aim of documenting an activity that has been little researched in Medellín, which has made it difficult to understand its dynamics and settings or to glimpse its future. The methodology comprised four steps: a documentary review; semi-structured interviews with the main actors in the medical-visit setting; processing of the information, using ATLAS.ti as the qualitative data analysis software; and, finally, analysis and presentation of the results obtained. As a result, it was possible to identify changes brought about by transformations in the legal framework, the adoption of new communication technologies, and greater demands on the development of occupational and academic skills.

    Plan de negocios para la creación de una sala de videojuegos tipo competitivo en la ciudad de Pereira

    Get PDF
    This project studies the feasibility of creating a competitive video game arcade in the city of Pereira, motivated by the absence of such venues and the exponential worldwide growth of e-sports. To support the market study, a series of research instruments was used, including surveys, participant observation, qualitative and quantitative research, and data analysis, in order to determine the potential market and its service needs. A financial analysis was carried out taking into account investment, operating, amortization and salary costs as well as income, with a five-year projection used to assess the feasibility of the project over the established period. Finally, the video game arcade was designed in order to give a better picture of the project.

    A Taste of Sound Reasoning in Faust

    No full text
    We address the question of what software verification can do for the audio community by showcasing some preliminary design ideas and tools for a new framework dedicated to formal reasoning about Faust programs. We use as a foundation one of the strongest current proof assistants, namely Coq combined with SSReflect. We illustrate the practical impact of our approach via a use case, namely the proof that the implementation of a simple low-pass filter written in the Faust audio programming language indeed meets one of its specification properties. The paper thus serves three purposes: (1) to provide a gentle introduction to the use of formal tools to the audio community, (2) to put forward programming and formal reasoning paradigms we think are well suited to the audio domain, and (3) to illustrate this approach on a simple yet practical audio signal processing example, a low-pass filter.
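
    For a sense of the kind of specification property involved, consider a classic one-zero low-pass filter y[n] = (x[n] + x[n-1]) / 2; the actual filter and property used in the paper are not reproduced here, so the following is only an illustrative example of such a property:

        % Frequency response of the one-zero low-pass filter y[n] = (x[n] + x[n-1]) / 2:
        H(e^{i\omega}) = \tfrac{1}{2}\bigl(1 + e^{-i\omega}\bigr)
                       = e^{-i\omega/2} \cos(\omega/2)
        % A simple gain-bound specification one can prove about it:
        |H(e^{i\omega})| = |\cos(\omega/2)| \le 1 \quad \text{for all } \omega,
        % with unit gain at DC (\omega = 0) and zero gain at Nyquist (\omega = \pi).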

    Bringing Theorem Proving to the (sonic) Masses

    No full text
    We explore the intersection of interactive theorem proving and digital signal processing through the use of web-based, rich interfaces. Traditionally, the barrier to entry to interactive theorem proving has been high. Provers are complex systems using obscure programming languages, and libraries may be under-documented and use formalisms and notations far from standard domain-specific practice. It is therefore no surprise that interactive theorem proving has seldom been explored in the digital signal processing community.

    In previous work [1,2], we formalized several DSP tools and concepts using the Coq theorem prover [3]. [1] presents a simplified model of Faust [4], a programming language tailored to audio DSP. The formalization allows one to reason about Faust programs and prove typical properties such as filter stability. In [2], we mechanized some of the theory of the Discrete Fourier Transform, following [5], and proved the main theorems. In our opinion, both developments are suitable as teaching material, either as an introduction to theorem proving or for students interested in DSP theory.

    However, in their current form, the student or reader must work with two versions of the document: the paper-and-ink version, which misses the interactive component of the theorem proving process, and the Coq proof scripts, which, being plain text files, usually lack convenient formatting and structure. This problem is also common in other teaching material (see for instance [6]), and while efforts to improve the situation have been made ([7] is one of the latest), it is still far from optimal, as most initiatives have been limited by implementation and platform choices.

    Our take on the problem, inspired by the long tradition of interactive documents found in the computer music domain, is to enrich the Coq document model to the full power of HTML5. Our goal is to have an integral solution, free of external components, so we have ported Coq to JavaScript, and we are writing a custom IDE for enriched proof documents.

    Thus, the once painful Coq installation and setup has become a page load in the browser, document navigation and interaction can take advantage of browser support, and our documents can use most available JavaScript libraries for extensibility. For instance, our model would allow us to incorporate a JavaScript-based score like [8] and relate its output to a particular Coq datatype in a quite straightforward way.

    The primary design goal of the web-based user interface is to support education and interactive papers, focusing on excellent user feedback, comprehensive documentation search, and good notational and navigational facilities.

    The current prototype can be accessed at [9]. It supports full Coq functionality, but the user interface is still in heavy development. While support for the enriched document model will likely not be available at the time of this submission, the main technical difficulties of the development have been solved, so we are confident of having usable versions within the workshop timeframe.